The Spectrum of Random Inner-product Kernel Matrices

Authors

  • Xiuyuan Cheng
  • Amit Singer
Abstract

We consider n-by-n matrices whose (i, j)-th entry is f(X_i^T X_j), where X_1, ..., X_n are i.i.d. standard Gaussian random vectors in R^p, and f is a real-valued function. The eigenvalue distribution of these random kernel matrices is studied in the "large p, large n" regime. It is shown that, when p, n → ∞ with p/n = γ held constant, and f is properly scaled so that Var(f(X_i^T X_j)) is O(p^{-1}), the spectral density converges weakly to a limiting density on R. The limiting density is dictated by a cubic equation involving its Stieltjes transform. While for smooth kernel functions the limiting spectral density has previously been shown to be the Marcenko-Pastur distribution, our analysis is applicable to non-smooth kernel functions, resulting in a new family of limiting densities.
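The setting in the abstract is easy to explore numerically. The sketch below (not code from the paper) builds the kernel matrix entrywise from Gaussian columns using the non-smooth kernel f = sign; the 1/sqrt(n) factor is an illustrative scaling choice so that the entry variance is of order 1/p when p/n = γ is held constant.

```python
import numpy as np

# Minimal simulation of a random inner-product kernel matrix.
# Assumed scaling: f(x) = sign(x) / sqrt(n), a non-smooth kernel.
rng = np.random.default_rng(0)
p, n = 200, 400                   # gamma = p / n = 0.5
X = rng.standard_normal((p, n))   # columns X_1, ..., X_n ~ N(0, I_p)

G = X.T @ X                       # Gram matrix of inner products X_i^T X_j
A = np.sign(G) / np.sqrt(n)       # apply the kernel entrywise, then scale
np.fill_diagonal(A, 0.0)          # keep only the off-diagonal kernel entries

eigs = np.linalg.eigvalsh(A)      # real spectrum of the symmetric matrix
print(f"spectral range: [{eigs.min():.2f}, {eigs.max():.2f}]")
```

A histogram of `eigs` approximates the limiting spectral density described in the abstract; repeating the experiment with a smooth f (e.g. the identity) instead recovers a Marcenko-Pastur-type bulk.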


Related articles

Spectral Norm of Random Kernel Matrices with Applications to Privacy

Kernel methods are an extremely popular set of techniques used for many important machine learning and data analysis applications. In addition to having good practical performance, these methods are supported by a well-developed theory. Kernel methods use an implicit mapping of the input data into a high dimensional feature space defined by a kernel function, i.e., a function returning the inne...


SVM learning with the Schur-Hadamard inner product for graphs

We apply support vector learning to attributed graphs where the kernel matrices are based on approximations of the Schur-Hadamard inner product. The evaluation of the Schur-Hadamard inner product for a pair of graphs requires the determination of an optimal match between their nodes and edges. It is therefore efficiently approximated by means of recurrent neural networks. The optimal mapping in...


New Probabilistic Bounds on Eigenvalues and Eigenvectors of Random Kernel Matrices

Kernel methods are successful approaches for different machine learning problems. This success is mainly rooted in using feature maps and kernel matrices. Some methods rely on the eigenvalues/eigenvectors of the kernel matrix, while for other methods the spectral information can be used to estimate the excess risk. An important question remains on how close the sample eigenvalues/eigenvectors a...


Gap Probabilities for Double Intervals in Hermitian Random Matrix Ensembles as τ-Functions – Spectrum Singularity case

Department of Mathematics and Statistics and School of Physics, University of Melbourne, Victoria 3010, Australia Email: [email protected] The probability for the exclusion of eigenvalues from an interval (−x, x) symmetrical about the origin for a scaled ensemble of Hermitian random matrices, where the Fredholm kernel is a type of Bessel kernel with parameter a (a generalisation of the ...


On the Universality for Orthogonal Ensembles of Random Matrices

We prove universality of local eigenvalue statistics in the bulk of the spectrum for orthogonal invariant matrix models with real analytic potentials with one interval limiting spectrum. Our starting point is the Tracy-Widom formula for the matrix reproducing kernel. The key idea of the proof is to represent the differentiation operator matrix written in the basis of orthogonal polynomials as a...



Published: 2012